
# Contrastive Learning Pretraining

## STAR

**Developer:** AIDA-UPM · **Tags:** Text Embedding, Transformers

STAR is a transformer model pretrained with supervised contrastive learning for understanding writing style in social media text.
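
A minimal usage sketch follows, assuming the checkpoint is published on the Hugging Face Hub as `AIDA-UPM/star` (repo id inferred from this listing, not verified) and that mean-pooled token embeddings serve as the style vector:

```python
# Hedged sketch: compare the writing style of two texts with STAR.
# The repo id and the mean-pooling recipe are assumptions, not the
# authors' documented usage.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "AIDA-UPM/star"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

texts = ["omg that movie was sooo good lol",
         "The film exhibited remarkable cinematography."]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, dim)

# Mean-pool over non-padding tokens to get one style vector per text.
mask = batch["attention_mask"].unsqueeze(-1)
emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

sim = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(f"style similarity: {sim.item():.3f}")
```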

## Siamese Smole Bert Muv 1x

**License:** Apache-2.0 · **Developer:** UdS-LSV · **Tags:** Molecular Model, Transformers

A neural language model toolkit for pretraining and fine-tuning SMILES-based molecular language models, with support for semi-supervised learning.
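
A hedged sketch of embedding SMILES strings with this checkpoint, assuming the repo id `UdS-LSV/siamese-smole-bert-muv-1x` (inferred from this listing) and standard `transformers` auto classes:

```python
# Hedged sketch: embed two molecules as SMILES strings and compare them.
# Repo id and [CLS]-pooling are assumptions; swap in the actual checkpoint.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "UdS-LSV/siamese-smole-bert-muv-1x"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

smiles = ["CCO", "c1ccccc1O"]  # ethanol, phenol
batch = tokenizer(smiles, padding=True, return_tensors="pt")
with torch.no_grad():
    emb = model(**batch).last_hidden_state[:, 0]  # [CLS] vector per molecule

sim = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(f"molecular similarity: {sim.item():.3f}")
```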

## TOD-XLMR

**License:** MIT · **Developer:** umanlp · **Tags:** Large Language Model, Transformers, Other

TOD-XLMR is a multilingual task-oriented dialogue model built on XLM-RoBERTa, trained with a dual-objective joint strategy to strengthen dialogue understanding.
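
A minimal sketch of encoding a dialogue turn, assuming the checkpoint loads as a standard XLM-RoBERTa encoder under the repo id `umanlp/TOD-XLMR` (id and casing inferred from this listing):

```python
# Hedged sketch: encode one multilingual dialogue turn with TOD-XLMR.
# The sentence-level [CLS] representation here is an assumed pooling
# choice; a real system would add a task-specific head on top.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "umanlp/TOD-XLMR"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

turn = "Ich möchte einen Tisch für zwei Personen reservieren."
batch = tokenizer(turn, return_tensors="pt")
with torch.no_grad():
    rep = model(**batch).last_hidden_state[:, 0]  # utterance representation
print(rep.shape)  # (1, hidden_size)
```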

## ReACC-py-retriever

**License:** MIT · **Developer:** microsoft · **Tags:** Text Embedding, Transformers

ReACC-py-retriever is a retrieval-augmented code completion model based on GraphCodeBERT, specifically designed for Python code retrieval and completion.
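
A hedged sketch of dense retrieval with this model, assuming the repo id `microsoft/reacc-py-retriever` (taken from this listing); the [CLS]-pooling below is an illustrative choice, not necessarily the paper's exact recipe:

```python
# Hedged sketch: rank candidate Python snippets against a query context.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "microsoft/reacc-py-retriever"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

def embed(code: str) -> torch.Tensor:
    """Encode a code string into a single dense vector."""
    batch = tokenizer(code, truncation=True, return_tensors="pt")
    with torch.no_grad():
        return model(**batch).last_hidden_state[:, 0].squeeze(0)

query = "def read_json(path):"
candidates = [
    "import json\ndef load(path):\n    return json.load(open(path))",
    "def add(a, b):\n    return a + b",
]
q = embed(query)
scores = [torch.cosine_similarity(q, embed(c), dim=0).item() for c in candidates]
best = max(zip(scores, candidates))[1]
print(best)  # the candidate most similar to the query context
```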

## Semi Supervised Classification Simclr

**License:** Apache-2.0 · **Developer:** keras-io · **Tags:** Image Classification

A semi-supervised image classification model pretrained with SimCLR contrastive learning on the STL-10 dataset, which covers 10 object categories.
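
To illustrate the pretraining objective behind this model, here is a generic PyTorch sketch of SimCLR's NT-Xent contrastive loss; this is a standalone illustration, not code from the keras-io repository:

```python
# Hedged sketch of SimCLR's NT-Xent loss: two augmented views of each
# image are pulled together, all other pairs in the batch pushed apart.
import torch
import torch.nn.functional as F

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """z1, z2: (N, d) projections of two augmented views of the same images."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)  # (2N, d), unit vectors
    sim = z @ z.t() / tau                        # temperature-scaled similarities
    sim.fill_diagonal_(float("-inf"))            # exclude self-pairs
    n = z1.size(0)
    # The positive for row i is its other view: i+n (first half) or i-n.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

# Example: random projections for a batch of 8 image pairs.
loss = nt_xent(torch.randn(8, 128), torch.randn(8, 128))
print(loss.item())
```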